Matrix Optimization Over Low-Rank Spectral Sets: Stationary Points and Local and Global Minimizers


Related articles

Low-rank spectral optimization

Various applications in signal processing and machine learning give rise to highly structured spectral optimization problems characterized by low-rank solutions. Two important examples that motivate this work are optimization problems from phase retrieval and from blind deconvolution, which are designed to yield rank-1 solutions. An algorithm is described based on solving a certain constrained ...
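As a rough illustration of the lifting that makes such problems rank-1, the sketch below (plain NumPy, real-valued, with illustrative names and sizes; this is not the paper's algorithm) checks that phase-retrieval-style measurements |⟨a_i, x⟩|² are linear in the lifted variable X = x xᵀ:

```python
import numpy as np

# Hypothetical illustration: phase retrieval measurements b_i = |<a_i, x>|^2
# become linear functions <a_i a_i^T, X> of the lifted variable X = x x^T,
# so recovering x is a rank-1-constrained problem over X.
rng = np.random.default_rng(0)
n, m = 20, 80
x_true = rng.standard_normal(n)
A = rng.standard_normal((m, n))                # measurement vectors a_i as rows
b = (A @ x_true) ** 2                          # |<a_i, x>|^2 (real case)

X = np.outer(x_true, x_true)                   # lifted rank-1 matrix X = x x^T
b_lifted = np.einsum('mi,ij,mj->m', A, X, A)   # <a_i a_i^T, X> for each i
print(np.allclose(b, b_lifted))                # the two formulations agree
```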


The Global Optimization Geometry of Low-Rank Matrix Optimization

In this paper we characterize the optimization geometry of a matrix factorization problem where we aim to find n×r and m×r matrices U and V such that UV^T approximates a given matrix X. We show that the objective function of the matrix factorization problem has no spurious local minima and obeys the strict saddle property not only for the exact-parameterization case where rank(X) = r, but also f...
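A minimal NumPy sketch of the factorized objective described here, assuming the loss f(U, V) = 0.5‖UV^T − X‖_F² with a small random initialization and an arbitrary step size (none of which are taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, r = 30, 20, 3
X = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))   # rank-r target

# Gradient descent on f(U, V) = 0.5 * ||U V^T - X||_F^2 from a small random start.
U = 0.1 * rng.standard_normal((n, r))
V = 0.1 * rng.standard_normal((m, r))
step = 0.5 / np.linalg.norm(X, 2)        # conservative step based on the top singular value
for _ in range(3000):
    R = U @ V.T - X                      # residual
    U, V = U - step * (R @ V), V - step * (R.T @ U)

print(np.linalg.norm(U @ V.T - X) / np.linalg.norm(X))   # relative fit error
```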


Global Optimality of Local Search for Low Rank Matrix Recovery

We show that there are no spurious local minima in the non-convex factorized parametrization of low-rank matrix recovery from incoherent linear measurements. With noisy measurements we show all local minima are very close to a global optimum. Together with a curvature bound at saddle points, this yields a polynomial-time global convergence guarantee for stochastic gradient descent ...
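The guarantee above concerns stochastic gradient descent; the toy sketch below only illustrates the factorized parametrization, using plain (full-batch) gradient descent on synthetic Gaussian measurements with arbitrary sizes and step size:

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, m = 15, 2, 300
U_star = rng.standard_normal((n, r))
X_star = U_star @ U_star.T                            # planted rank-r PSD matrix
A = rng.standard_normal((m, n, n))
A = 0.5 * (A + A.transpose(0, 2, 1))                  # symmetric Gaussian sensing matrices
y = np.einsum('kij,ij->k', A, X_star)                 # linear measurements y_k = <A_k, X_star>

def grad(U):
    """Gradient of f(U) = (1/2m) * sum_k (<A_k, U U^T> - y_k)^2."""
    resid = np.einsum('kij,ij->k', A, U @ U.T) - y
    return (2.0 / m) * np.einsum('k,kij,jl->il', resid, A, U)

U = 0.1 * rng.standard_normal((n, r))                 # small random initialization
step = 0.005
for _ in range(2000):
    U -= step * grad(U)

print(np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star))   # relative recovery error
```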


Local Low-Rank Matrix Approximation

Matrix approximation is a common tool in recommendation systems, text mining, and computer vision. A prevalent assumption in constructing matrix approximations is that the partially observed matrix is of low-rank. We propose a new matrix approximation model where we assume instead that the matrix is locally of low-rank, leading to a representation of the observed matrix as a weighted sum of low...
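A small synthetic sketch of the locally low-rank idea, using hard column blocks in place of the paper's smooth weighting (all names and sizes are illustrative): a matrix built from several independent rank-2 blocks is fit poorly by one global rank-2 SVD but exactly by a blockwise (weighted) sum of local rank-2 approximations.

```python
import numpy as np

rng = np.random.default_rng(3)
n_rows, n_cols, rank, n_blocks = 60, 40, 2, 4

def truncated_svd(A, r):
    """Best rank-r approximation of A by truncated SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

# A matrix that is only *locally* low-rank: each column block has its own rank-2
# factors, so the full matrix has rank up to 8.
col_groups = np.array_split(np.arange(n_cols), n_blocks)
M = np.zeros((n_rows, n_cols))
for cols in col_groups:
    M[:, cols] = rng.standard_normal((n_rows, rank)) @ rng.standard_normal((rank, len(cols)))

global_err = np.linalg.norm(truncated_svd(M, rank) - M) / np.linalg.norm(M)

# Weighted (here: piecewise) sum of local low-rank approximations, one per block.
local = np.zeros_like(M)
for cols in col_groups:
    local[:, cols] = truncated_svd(M[:, cols], rank)
local_err = np.linalg.norm(local - M) / np.linalg.norm(M)

print(global_err, local_err)   # the local model fits far better at the same rank
```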


Fixed-rank matrix factorizations and Riemannian low-rank optimization

Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian quotient manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and...
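A crude stand-in for optimization over fixed-rank matrices, using a truncated-SVD projection as the retraction rather than the Riemannian quotient-manifold tools the abstract studies (the example problem, sizes, and step size are assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, r = 40, 30, 2
X_star = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))   # fixed-rank ground truth
mask = rng.random((n, m)) < 0.5                                      # observed entries

def retract_rank(Y, r):
    """Best rank-r approximation by truncated SVD, used here as a simple retraction."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

# Projected gradient descent on f(X) = 0.5 * ||mask * (X - X_star)||_F^2
# over the set of matrices of rank at most r (a toy matrix-completion instance).
X = np.zeros((n, m))
step = 1.0
for _ in range(500):
    X = retract_rank(X - step * (mask * (X - X_star)), r)

print(np.linalg.norm(X - X_star) / np.linalg.norm(X_star))   # completion error
```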



Journal

Journal title: Journal of Optimization Theory and Applications

Year: 2019

ISSN: 0022-3239, 1573-2878

DOI: 10.1007/s10957-019-01606-8